Best Practice for Caching of Single-Path Code
Authors
Abstract
Single-path code has some unique properties that make it interesting to explore different caching and prefetching alternatives for the stream of instructions. In this paper, we explore different cache organizations and how they perform with single-path code.

1998 ACM Subject Classification: C.3 Real-Time and Embedded Systems
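To make these properties concrete: single-path code is typically obtained by converting control-flow alternatives into predicated, straight-line code, so the fetched instruction stream is the same for every input. The C sketch below illustrates that idea under the usual if-conversion assumption; it is an illustration, not code from the paper.

/* Illustration of the single-path idea via if-conversion (assumed for
   illustration, not taken from the paper). */
#include <stdio.h>

/* Conventional code: the executed instruction stream depends on the input. */
int clamp_branching(int x, int limit) {
    if (x > limit)
        return limit;
    return x;
}

/* Single-path style: both alternatives are computed and the result selected
   with a predicate, so every call executes the same straight-line sequence.
   An input-independent, sequential instruction stream is what makes simple
   prefetching and cache analysis attractive for this kind of code. */
int clamp_single_path(int x, int limit) {
    int p = (x > limit);            /* predicate */
    return p * limit + (1 - p) * x; /* predicated select */
}

int main(void) {
    printf("%d %d\n", clamp_branching(7, 5), clamp_single_path(7, 5));
    return 0;
}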
Related works
A sequence of targets toward a common best practice frontier in DEA
Original data envelopment analysis models treat decision-making units as independent entities. This feature of data envelopment analysis results in significant diversity in input and output weights, which is irrelevant and problematic from the managerial point of view. In this regard, several methodologies have been developed to measure the efficiency scores based on common weights. Specificall...
Bounding Pipeline and Instruction Cache Performance
Predicting the execution time of code segments in real-time systems is challenging. Most recently designed machines contain pipelines and caches. Pipeline hazards may result in multicycle delays. Instruction or data memory references may not be found in cache and these misses typically require several cycles to resolve. Whether an instruction will stall due to a pipeline hazard or a cache miss ...
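To illustrate why unclassified cache accesses matter for a timing bound, here is a back-of-the-envelope calculation; all latency and count values are assumed for the example and are not taken from the cited work.

#include <stdio.h>

int main(void) {
    const int hit_cycles   = 1;   /* assumed fetch latency on a cache hit    */
    const int miss_penalty = 10;  /* assumed extra cycles on a cache miss    */
    const int fetches      = 100; /* instruction fetches in a code segment   */
    const int unclassified = 20;  /* fetches the analysis cannot prove to hit */

    /* A safe worst-case bound must charge the miss penalty for every fetch
       that static analysis cannot classify as a hit. */
    int bound = fetches * hit_cycles + unclassified * miss_penalty;
    printf("worst-case fetch cycles: %d\n", bound);
    return 0;
}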
Integrating the Timing Analysis of Pipelining and Instruction Caching
Recently designed machines contain pipelines and caches. While both features provide significant performance advantages, they also pose problems for predicting execution time of code segments in real-time systems. Pipeline hazards may result in multicycle delays. Instruction or data memory references may not be found in cache and these misses typically require several cycles to resolve. Whether...
Optimistic Lookup of Whole NFS Paths in a Single Operation
VFS lookup code examines and translates path names one component at a time, checking for special cases such as mount points and symlinks. VFS calls the NFS lookup operation as necessary. NFS employs caching to reduce the number of lookup operations that go to the server. However, when part or all of a path is not cached, NFS lookup operations go back to the server. Although NFS's caching is ee...
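The sketch below illustrates the component-at-a-time lookup with a client-side cache described above. The helper names and the name-only cache key are simplifications assumed for illustration (a real directory-name cache is keyed by parent directory plus component); it only shows why an uncached path costs roughly one server round trip per component, which is what a whole-path lookup tries to avoid.

/* Minimal sketch (assumed, not the NFS implementation): resolve a path one
   component at a time, consulting a lookup cache first and counting a
   simulated server round trip on every miss. */
#include <stdio.h>
#include <string.h>

#define MAX_CACHED 16

static char cached[MAX_CACHED][64];
static int  ncached = 0;

static int cache_has(const char *name) {
    for (int i = 0; i < ncached; i++)
        if (strcmp(cached[i], name) == 0)
            return 1;
    return 0;
}

static void cache_add(const char *name) {
    if (ncached < MAX_CACHED) {
        strncpy(cached[ncached], name, 63);
        cached[ncached][63] = '\0';
        ncached++;
    }
}

/* Count how many components of `path` require a (simulated) server lookup. */
static int resolve(const char *path) {
    char buf[256];
    strncpy(buf, path, 255);
    buf[255] = '\0';

    int server_lookups = 0;
    for (char *comp = strtok(buf, "/"); comp; comp = strtok(NULL, "/")) {
        if (!cache_has(comp)) {   /* cache miss: one round trip per component */
            server_lookups++;
            cache_add(comp);
        }
    }
    return server_lookups;
}

int main(void) {
    printf("first resolve:  %d server lookups\n", resolve("/usr/local/bin/tool"));
    printf("second resolve: %d server lookups\n", resolve("/usr/local/bin/tool"));
    return 0;
}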
Scalable Highly Available Web Caching
This paper proposes translucent caching as an alternative to transparent caches. Translucent caches use the fact that network routing forwards a request for an object along the best path from the client to the object's home server. Along this best path, routers direct the request toward a cache chosen among a collection of nearby translucent caches. Unlike transparent caching, which relies on router...
Journal:
Volume, Issue:
Pages: -
Publication date: 2017